Journal article

Focused Contrastive Loss for Classification With Pre-Trained Language Models

J He, Y Li, Z Zhai, B Fang, C Thorne, C Druckenbrodt, S Akhondi, K Verspoor

IEEE Transactions on Knowledge and Data Engineering | Published : 2024

Abstract

Contrastive learning, which learns data representations by contrasting similar and dissimilar instances, has achieved great success in various domains, including natural language processing (NLP). Recently, it has been demonstrated that incorporating class labels into contrastive learning, i.e., supervised contrastive learning (SCL), can further enhance the quality of the learned representations. Although several works have shown empirically that incorporating SCL into classification models leads to better performance, the mechanism by which SCL aids classification is less studied. In this paper, we first investigate how SCL facilitates classifier learning, where we show that the ..
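For readers unfamiliar with the objective the abstract builds on, the standard supervised contrastive loss (SCL, as in Khosla et al., 2020) averages, for each anchor, the negative log-probability of its same-class "positives" against all other instances in the batch. The sketch below is an illustrative NumPy implementation of that standard SCL objective, not the Focused Contrastive Loss proposed in this article; the function name and temperature default are our own choices.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Standard SCL: pull same-class embeddings together, push other
    classes apart. Illustrative only, not this paper's focused loss."""
    # L2-normalise so similarities are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature  # temperature-scaled pairwise similarities
    n = len(labels)
    loss, n_anchors = 0.0, 0
    for i in range(n):
        positives = [p for p in range(n) if p != i and labels[p] == labels[i]]
        if not positives:
            continue  # anchors with no same-class partner contribute nothing
        denom = sum(np.exp(sim[i, a]) for a in range(n) if a != i)
        loss += -sum(np.log(np.exp(sim[i, p]) / denom)
                     for p in positives) / len(positives)
        n_anchors += 1
    return loss / n_anchors

# Usage: a batch whose classes are tightly clustered yields a lower
# loss than one whose classes are intermingled.
tight = np.array([[1.0, 0.0], [0.99, 0.01], [0.0, 1.0], [0.01, 0.99]])
mixed = np.array([[1.0, 0.0], [0.0, 1.0], [0.99, 0.01], [0.01, 0.99]])
labels = [0, 0, 1, 1]
print(supervised_contrastive_loss(tight, labels)
      < supervised_contrastive_loss(mixed, labels))  # True
```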

